Generating rules from trained network using fast pruning
Authors
Abstract
Before symbolic rules are extracted from a trained neural network, the network is usually pruned so as to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network, FERNN first identifies the relevant hidden units by computing their information gains. Next, it identifies relevant connections from the input units to the relevant hidden units by checking the magnitudes of their weights. Finally, FERNN generates rules based on the relevant hidden units and weights. Our experimental results show that the size and accuracy of the tree generated are comparable to those extracted by another method that prunes and retrains the network.
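The two selection steps described above can be illustrated with a minimal sketch. This is not FERNN's actual implementation; the function names, the activation threshold of 0.5, and the gain and magnitude cutoffs are all illustrative assumptions. It shows how a hidden unit's relevance can be scored by the information gain of a split on its activation, and how input connections can be kept or dropped by weight magnitude.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(activations, labels, threshold=0.5):
    """Gain from splitting the samples on one hidden unit's activation.

    The threshold of 0.5 is an illustrative choice for sigmoid units.
    """
    mask = activations >= threshold
    gain = entropy(labels)
    for side in (mask, ~mask):
        if side.any():
            gain -= side.mean() * entropy(labels[side])
    return gain

def select_relevant_units(H, y, gain_cutoff=0.01):
    """Indices of hidden units whose activation split yields useful gain.

    H is an (n_samples, n_hidden) matrix of hidden activations; the
    gain_cutoff is a hypothetical tuning parameter.
    """
    return [j for j in range(H.shape[1])
            if information_gain(H[:, j], y) > gain_cutoff]

def select_relevant_weights(W, unit_idx, mag_cutoff=0.1):
    """Input connections to one hidden unit whose |weight| exceeds the cutoff.

    W is an (n_inputs, n_hidden) input-to-hidden weight matrix.
    """
    return [i for i, w in enumerate(W[:, unit_idx]) if abs(w) > mag_cutoff]

# Toy example: unit 0 separates the classes, unit 1 is uninformative.
H = np.array([[0.9, 0.5], [0.8, 0.5], [0.1, 0.5], [0.2, 0.5]])
y = np.array([1, 1, 0, 0])
print(select_relevant_units(H, y))               # only unit 0 survives

W = np.array([[0.8, 0.01], [0.05, 0.9], [-0.6, 0.02]])
print(select_relevant_weights(W, unit_idx=0))    # inputs 0 and 2 survive
```

Because no retraining is involved, both steps are a single pass over the stored activations and weights, which is the source of the speedup the abstract claims over prune-and-retrain methods.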
Similar resources
Pruning recurrent neural networks for improved generalization performance
Determining the architecture of a neural network is an important issue for any learning task. For recurrent neural networks no general methods exist that permit the estimation of the number of layers of hidden neurons, the size of layers or the number of weights. We present a simple pruning heuristic that significantly improves the generalization performance of trained recurrent networks. We il...
Full text
Effectively Generating Frequent Episode Rules for Anomaly-based Intrusion Detection*
Data mining is a useful tool for building classifiers to distinguish intrusive behavior from normal network traffic. In this paper, we provide new pruning techniques for the reduction of frequent episode rules to build anomaly-based intrusion detection systems (IDS). This reduction is crucial to use data mining for anomaly detection of unknown attacks. Otherwise, the rule search space may escalat...
Full text
Fast Generation of a Sequence of Trained and Validated Feed-Forward Networks
In this paper, three approaches are presented for generating and validating sequences of different-size neural nets. First, a growing method is given along with several weight initialization methods and their properties. Then a one-pass pruning method is presented which utilizes orthogonal least squares. Based upon this pruning approach, a one-pass validation method is discussed. Finally, a tra...
Full text
A flow based pruning scheme for enumerative equitable coloring algorithms
An equitable graph coloring is a proper vertex coloring, such that the size of the color classes differ by at most one. We present a flow-based scheme for generating pruning rules for enumerative algorithms such as DSATUR. The scheme models the extendability of a partial (equitable) coloring into an equitable coloring via network flows. Computational experiments show that significant reductions...
Full text
Extracting Rules from Neural Networks by Pruning and Hidden-Unit Splitting
An algorithm for extracting rules from a standard three-layer feedforward neural network is proposed. The trained network is first pruned not only to remove redundant connections in the network but, more important, to detect the relevant inputs. The algorithm generates rules from the pruned network by considering only a small number of activation values at the hidden units. If the number of inp...
Full text